
    On Computing Robust N-Finger Force-Closure Grasps of 3D Objects

    The paper deals with the problem of computing frictional force-closure grasps of 3D objects. The key idea of the presented work is the demonstration that the wrenches associated with any three non-aligned contact points on a 3D object form a basis of its 6D wrench space. This result permits the formulation of a new sufficient force-closure test. Our approach works with general objects, modelled as a set of points, and with any number n of contacts (n >= 4). A quality criterion is also introduced, and a corresponding algorithm for computing robust force-closure grasps has been developed. Its efficiency is confirmed by comparing it to the classical convex-hull method.
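
    As a point of reference for the comparison mentioned above, the classical convex-hull force-closure test can be sketched as follows. This is a minimal illustration, not the authors' algorithm: it assumes hard-finger contacts, a linearised friction cone, and unit contact forces, and it checks whether the origin of the 6D wrench space lies strictly inside the convex hull of the primitive contact wrenches.

    import numpy as np
    from scipy.spatial import ConvexHull

    def contact_wrenches(p, n, mu, m=8):
        # Primitive wrenches of one hard-finger contact: m unit forces sampled on
        # the boundary of the friction cone at point p with inward normal n and
        # friction coefficient mu.
        n = n / np.linalg.norm(n)
        t1 = np.cross(n, [1.0, 0.0, 0.0])
        if np.linalg.norm(t1) < 1e-6:        # normal parallel to x-axis, use another axis
            t1 = np.cross(n, [0.0, 1.0, 0.0])
        t1 /= np.linalg.norm(t1)
        t2 = np.cross(n, t1)
        wrenches = []
        for k in range(m):
            a = 2.0 * np.pi * k / m
            f = n + mu * (np.cos(a) * t1 + np.sin(a) * t2)   # force along a cone edge
            f /= np.linalg.norm(f)
            wrenches.append(np.hstack([f, np.cross(p, f)]))  # wrench = [force, torque]
        return np.array(wrenches)

    def force_closure_convex_hull(points, normals, mu):
        # Classical test: the grasp is force-closure iff the origin of the wrench
        # space lies strictly inside the convex hull of all primitive wrenches.
        W = np.vstack([contact_wrenches(p, n, mu) for p, n in zip(points, normals)])
        hull = ConvexHull(W)
        # scipy stores each facet as  a . x + b <= 0; the origin is strictly
        # inside the hull iff every offset b is negative.
        return bool(np.all(hull.equations[:, -1] < -1e-9))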

    A sufficient condition for computing n-finger force-closure grasps of 3D objects

    We address the problem of computing n-finger force-closure grasps of 3D objects. As 3D force-closure grasps involve the 6D wrench space, we use Plücker coordinates and Grassmann algebra to demonstrate that the wrenches associated with any three non-aligned contact points on a 3D object form a basis of the 6D wrench space. Thus, given non-aligned locations of n - 1 fingers, a 6D basis can be extracted from their wrenches. This permits the formulation of a fast and simple sufficient force-closure test. The problem is transformed into searching for the set of locations of the nth finger whose wrenches can be uniquely expressed as a strictly negative linear combination of the 6D basis. We have implemented the algorithm and confirmed its efficiency by comparing it to the classical convex-hull method.
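
    The algebraic core of the sufficient test described above can be written down directly. In the sketch below, B is assumed to be a 6x6 matrix whose columns are six wrenches taken from the n - 1 already placed contacts, and w is the wrench of a candidate location for the nth finger; how the paper actually extracts B from three non-aligned contacts is not reproduced here.

    import numpy as np

    def passes_sufficient_test(B, w, eps=1e-9):
        # w satisfies the sufficient force-closure condition if it can be written
        # as B @ lam with every coefficient lam[i] strictly negative.
        if abs(np.linalg.det(B)) < eps:      # the six columns must actually form a basis
            return False
        lam = np.linalg.solve(B, w)          # unique coefficients in that basis
        return bool(np.all(lam < -eps))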

    Handling Objects by Their Handles

    This paper presents an efficient method for deciding robust grasps of new objects using example-based learning. A robust grasp is a stable grasp that is suitable for object manipulation. Adaptability to object manipulation is ensured by imitating the human choice of the object's grasping component: its handle. Stability is obtained by computing contact points on that handle that ensure the force-closure property.

    A Sufficient Condition for Force-Closure Grasps Synthesis of 3D Objects


    Learning the natural grasping component of an unknown object

    A grasp is the beginning of any manipulation task. Therefore, an autonomous robot should be able to grasp objects it sees for the first time, and it must hold them appropriately in order to perform the task successfully. This paper considers the problem of grasping unknown objects in the same manner as humans do. Based on the idea that the human brain represents objects as assemblies of volumetric primitives in order to recognize them, the presented algorithm predicts a grasp as a function of the assembly of the object's parts. Starting from a complete 3D model of the object, a segmentation step decomposes it into single parts, and each part is fitted with a simple geometric model. A learning step is finally needed to find the object component that humans would choose for grasping.

    Apprentissage de la partie préhensible d'un objet de forme quelconque (Learning the graspable part of an arbitrarily shaped object)


    On the Generation of a Variety of Grasps

    In everyday life, people use a large diversity of hand configurations when reaching out to grasp an object. They tend to vary their hand's position and orientation around the object, and their fingers' placement on its surface, according to the object's properties, such as its weight, shape, and friction coefficient, and according to the task they need to accomplish. Taking these properties into account, we propose a method for generating such a variety of good grasps that can be used to accomplish many different tasks. Grasp synthesis is formulated as a single constrained optimization problem that generates grasps feasible for the hand's kinematics by minimizing the norm of the hand's joint torque vector while ensuring grasp stability. Given an object and a kinematic hand model, this method can easily be used to build a library of possible grasps for that object. We show that the approach adapts to different representations of the object surface and to different hand kinematic models.
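
    A rough sketch of that optimization formulation is given below, under stated assumptions: the callables torque_of (mapping a hand configuration to the joint torques balancing the object) and stability_margin (positive when the grasp is stable) are placeholders to be supplied by a hand/object model, not code from the paper, and SLSQP is just one generic solver choice. Restarting the optimization from different initial hand poses q0 is one simple way to obtain a variety of grasps.

    import numpy as np
    from scipy.optimize import minimize

    def synthesise_grasp(q0, joint_limits, torque_of, stability_margin):
        # Minimise the norm of the hand's joint torque vector over configurations q,
        # subject to joint limits and a grasp-stability constraint.
        objective = lambda q: np.linalg.norm(torque_of(q))
        constraints = [{"type": "ineq", "fun": stability_margin}]   # stability_margin(q) >= 0
        result = minimize(objective, q0, method="SLSQP",
                          bounds=joint_limits, constraints=constraints)
        return result.x if result.success else None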

    EMG-Based Analysis of the Upper Limb Motion

    In a human-robot interaction scenario, predicting the human's motion intention is essential for avoiding inconvenient delays and ensuring smooth reactivity of the robotic system. In particular, when dealing with hand prosthetic devices, an early estimate of the final hand gesture is crucial for smooth control of the robotic hand. In this work we develop an electromyography (EMG) based learning approach that decodes the grasping intention at an early stage of the reach-to-grasp motion, i.e., before the final grasp/hand preshape takes place. EMG electrodes are used to record the arm muscles' activity, and a cyberglove is used to measure the finger joints during the reach and grasp motion. Results show that three typical grasps can be correctly classified with 90% accuracy before the onset of the hand pre-shape. Such early detection of the grasp intention makes it possible to control a robotic hand simultaneously with the motion of the subject's arm, generating no delay between the natural arm motion and the artificial hand motion.
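
    A minimal sketch of an EMG-window classifier in the spirit of the above follows. The feature (per-channel RMS over a window cut before pre-shape onset) and the SVM are assumptions made for illustration; the paper's actual features, window lengths, and learning method are not reproduced here.

    import numpy as np
    from sklearn.svm import SVC

    def rms_features(emg_window):
        # Root-mean-square of each EMG channel over one (samples x channels)
        # window recorded before the onset of the hand pre-shape.
        return np.sqrt(np.mean(np.square(emg_window), axis=0))

    def train_grasp_classifier(emg_windows, grasp_labels):
        # grasp_labels: one of the three grasp types, e.g. determined from the
        # cyberglove at the end of each recorded reach-to-grasp motion.
        X = np.vstack([rms_features(w) for w in emg_windows])
        return SVC(kernel="rbf").fit(X, grasp_labels)

    # Prediction on a new pre-shape window:
    #   grasp = classifier.predict(rms_features(new_window).reshape(1, -1))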